This article originally appeared in The Bar Examiner print edition, Winter 2018–2019 (Vol. 87, No. 4), pp. 3–6.

By Judith A. Gundersen

“Woman Eaten by Tiger During Bar Exam!” “Bar Exam Hardest One Ever!” “Bar Exam Persists in Testing Legal Knowledge!”

In a world with a 24/7 news cycle, false news, and swift competition for attention-grabbing headlines and clickbait, it is tempting to be Attention-Grabbing! Controversial! Shocking! so that stakeholders pay attention.

NCBE is not going to use clickbait or lead our news with any attention-grabbing headlines. We are conservative with a small “c.” And I think that is exactly how the public, admissions authorities, and even examinees and law schools would want the entity that produces the lawyer licensing exam to be. Our “headlines” would reflect our faithful adherence to best practices in testing, our reliance on robust measurement expertise and a highly skilled editorial staff, and our collaboration with a vast network of volunteer subject-matter experts well trained in item writing and reviewing. Our news might focus on the fact that we are constantly reviewing our procedures, conferring with admissions authorities, evaluating our staffing resources, and researching technological solutions to seek ways to improve the development, production, and scoring processes for our examinations. (For a look at the changes made to the bar exam throughout the years, see “NCBE Testing Milestones” on our Testing Task Force website, www.testingtaskforce.org.)

Boring? Perhaps. Important? Yes.

But being “conservative” does not mean being passive or resting on our laurels. It is appropriate, for instance, to periodically study any licensing exam to ensure that it continues to test the knowledge and skills that newly licensed practitioners need to practice competently. So, here are a couple of headlines: “NCBE to Study Bar Exam!” “NCBE Forming Task Force to Undertake a Future-Focused Study of the Knowledge, Skills, and Abilities Needed for Competent 21st-Century Practice!” Hmmm . . . not exactly clickbait.

Licensing exams must stay valid and reliable—they must test what has been determined to be the relevant content domain for the profession (i.e., the relevant constructs) and do so in a reproducible fashion so that there is confidence and consistency in the results. The bar exam would not be valid, for instance, if it tested would-be lawyers on the symptoms of a heart attack or their working knowledge of Spanish; while both may be important to know (as a Spanish major, I, for one, found my knowledge of Spanish useful in the practice of law), neither belongs on the exam because each falls outside the domain of what minimally competent new lawyers need to know.

According to the Standards for Educational and Psychological Testing, an exam must provide evidence to support the validity of test scores—it must indicate which constructs the test is intended to measure and demonstrate that the test does indeed measure those constructs. In the world of high-stakes licensure testing, validity is established by gathering empirical evidence of what newly licensed professionals in the field need to know to perform their jobs at least minimally competently and by connecting this evidence to the content and tasks covered on the licensing exam.1

Reliability is also an essential feature of high-stakes tests. While validity pertains to content, reliability pertains to reproducibility of results. If an examinee were to take a reliable test over and over and over again (without obtaining additional knowledge in between tests), one would expect the examinee’s score to be about the same each time. This means that any given score isn’t a one-off but in fact accurately reflects the examinee’s proficiency. This is where the MBE plays a key role—with 175 scored questions, scores are highly reliable and reproducible. If the bar exam consisted of only two questions, for example, a highly prepared examinee could credibly complain that the exam tested the exact two topics he or she did not study. That’s harder to assert with 175 multiple-choice questions, and when those questions are combined with essay and performance test questions, the resulting total score can legitimately claim to be a reliable measure of proficiency.
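
To make the arithmetic behind that intuition concrete, here is a minimal, hypothetical simulation (a classical-test-theory sketch, not NCBE's actual psychometric model; the function names and numbers are invented for illustration). It assumes a single examinee whose proficiency gives a fixed 70% chance of answering any one item correctly and shows how the spread of percent-correct scores across repeated sittings shrinks as the test gets longer, roughly in proportion to one over the square root of the number of items:

```python
import random

def simulate_scores(p_correct, n_items, n_sittings=10_000, seed=1):
    """Simulate repeated sittings of an n_items test for one examinee
    whose true chance of answering any single item correctly is
    p_correct. Returns the percent-correct score from each sitting."""
    rng = random.Random(seed)
    scores = []
    for _ in range(n_sittings):
        n_right = sum(rng.random() < p_correct for _ in range(n_items))
        scores.append(100 * n_right / n_items)
    return scores

def spread(scores):
    """Standard deviation of the simulated scores: the sitting-to-sitting
    'wobble' in this examinee's score on a test of that length."""
    m = sum(scores) / len(scores)
    return (sum((s - m) ** 2 for s in scores) / len(scores)) ** 0.5

for n_items in (2, 25, 175):
    sd = spread(simulate_scores(p_correct=0.70, n_items=n_items))
    print(f"{n_items:>3} items: score wobble ~ {sd:.1f} percentage points")
```

With only 2 items, the same examinee's score swings by more than 30 percentage points from one sitting to the next; at 175 items the swing falls to roughly 3.5 points. That is the arithmetic behind anchoring the exam with a long multiple-choice section and combining it with essay and performance test questions.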

By their very nature and purpose (public protection), licensing exams like the bar exam must be crafted to ensure that those who obtain the professional license can safely practice in their chosen field. A license to practice law in the United States is a general license, not a specialized one focusing on, say, commercial litigation. So the licensing exam is geared to testing general legal knowledge and skills. That means that an examinee who has no intention of ever appearing in a courtroom might have to draft a closing argument in a performance test or apply the excited utterance exception to the rule against hearsay in an essay or multiple-choice question. Unless and until a license to practice is a specialized one, this will remain the case. Should the bar exam instead be a series of specific licensing exams to reflect how lawyers now practice law—in one particular area? I don’t have the answer to that, except to note that in my hometown (population 8,000), there most certainly are solo practitioners who are generalists. The bar exam must be a test of minimal competence for all aspiring lawyers, whether they go on to practice in a small town or at a big-city law firm.

As jurisdictions were releasing their July 2018 scores, a few online sources for legal news speculated on the cause of the drop in pass rates. The answer is not without some complexity and defies explanation in just a few words. In prior issues of the Bar Examiner, we’ve talked about trends—law school enrollment and LSAT scores for those most at risk of not passing the bar exam, and how these correlate with pass rates—and also about how the equating process ensures score stability across bar exam administrations. Admittedly, none of these concepts can compare to a headline such as “Pass Rates Eaten by Tiger!” Our news here at NCBE is rarely EXCITING! or clickbait material, but it is fact-based and grounded in empirical evidence and measurement science.
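
For readers wondering what equating actually does, here is a toy sketch of the classical mean/sigma linear method under a random-groups design. It is purely illustrative (the function name and all numbers below are invented), and operational equating for a licensing exam is considerably more sophisticated, typically using embedded common items and more elaborate models so that real differences in group ability are not mistaken for differences in form difficulty:

```python
from statistics import mean, stdev

def linear_equate(raw_x, form_x_scores, form_y_scores):
    """Classical mean/sigma linear equating under a random-groups design:
    if two equivalent groups take two different forms, any difference
    between the two score distributions is attributed to form difficulty,
    so a raw score on form X is re-expressed on form Y's scale by
    matching standardized scores."""
    z = (raw_x - mean(form_x_scores)) / stdev(form_x_scores)
    return mean(form_y_scores) + z * stdev(form_y_scores)

# Hypothetical raw scores from two equivalent groups: form Y is the
# reference form, form X a new, slightly harder form. A raw 130 on the
# harder form maps to roughly 139 on the reference scale.
form_y = [120, 135, 140, 150, 155, 160, 170]  # reference form
form_x = [112, 126, 131, 141, 146, 151, 161]  # new, harder form
print(round(linear_equate(130, form_x, form_y), 1))
```

The point of the adjustment is that an examinee is neither advantaged nor penalized for drawing a slightly harder or easier form of the exam.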

Speaking of exciting, we have undertaken quite a few exciting projects over the course of the past 18 months. To name a few: We announced a new collaboration with the Council on Legal Education Opportunity (CLEO) to increase diversity and inclusion in the legal profession. We appointed our Testing Task Force, charged with undertaking a three-year study of the bar exam; the Task Force announced its research plan in November 2018 (see the Testing Task Force Quarterly Update) and has since embarked on phases one and two of its study. It has held multiple input-gathering stakeholder research sessions, most recently at the Association of American Law Schools (AALS) Annual Meeting, the Uniform Bar Examination Jurisdiction Forum that we held in San Antonio in mid-January, and the ABA Midyear Meeting, and has begun a series of focus groups for subject-matter experts (entry-level lawyers and lawyers who supervise entry-level lawyers) that will lay the foundation for a future-focused practice analysis survey to be distributed nationwide later this year. We began the process of migrating the MPRE from a paper-based to a computer-based delivery platform, and we began converting our study aids to a learning management system, which we plan to unveil this year. Finally, to assist both our in-house psychometricians and the work of our Testing Task Force, we have engaged a Technical Advisory Panel composed of five psychometricians from academia and other testing programs, including licensure exams, to offer independent advice and consultation on all aspects of our testing programs. The Technical Advisory Panel members have impeccable credentials and bring complementary areas of measurement expertise to the table.

These initiatives should instill confidence that we are working to ensure that our exams continue to test the relevant skills, knowledge, and abilities that will be required of newly licensed lawyers in the upcoming decades of the 21st century; that NCBE’s volunteer and staff leadership is tuned in to the broader legal education–admissions ecosystem and that we value diversity in law school and the profession; that NCBE is committed to democratizing bar preparation with the repurposing and repackaging of released bar exam questions in formats that are device-adaptable, affordable, and geared to a new generation of examinees; and that we are looking at new test delivery platforms that enhance security, standardize testing conditions, and allow better measurement analysis.

Exciting changes have also been taking place at NCBE’s headquarters in Madison, Wisconsin, over the past year. We’re in a period of growth at the Conference that is contributing mightily to our ability to effect change; build strategic alliances with other stakeholders; and of course maintain high-quality tests and measurement services, investigative services, and educational programming. We have added some new positions that are critical to allowing us to continue to serve bar admitting authorities and the greater admissions community.

In February 2018 we created the position of Chief Strategy Officer, filled by Kellie Early. Kellie has been with NCBE since 2010 and brings a wealth of bar exam administration experience as the former executive director of the Missouri Board of Law Examiners. As we embark on new initiatives, defining a strategy with clear goals is important, and we are engaged in that process at NCBE. Kellie works closely with our Special Projects Manager and our Strategic and Technical Solutions Specialist (both also new positions in 2018).

We’ve also created a Director of Test and Information Security position, filled by C. Beth Hill, our former MBE Program Director. Beth is responsible for all aspects of test security, from investigating possible cheating incidents to working with our outside vendor on web patrolling to ensuring that our test administration Supervisor’s Manual and website language reflect security best practices.

We have a new Communications Department, headed by Claire Guback, editor of this publication, in the role of Editorial Director. This department brings together several existing employees and also includes a new Digital Communications Specialist position. You can now follow us on Facebook, Twitter, and LinkedIn.

Our Meetings and Education Department staff has expanded, too, in an effort to improve and broaden our educational programming for bar admissions stakeholders and others in legal education, with Laurie Lutz now serving as Director of Meetings and Education.

Staffing in our Test Operations and Testing & Research Departments has been bolstered as well. We’ve added two new administrative positions—one focused on customer service and the other on quantitative and technical assistance to our researchers and psychometricians. We’ve also recently added a test editor to assist with test development for the MPRE, MEE, and MPT.

To keep up with all this growth, our IT Department is also expanding to facilitate project management and business analysis, quality control, and infrastructure improvement and maintenance.

On a last note, in an effort to make an impact on the larger admissions–legal education landscape, we co-sponsored with the Law School Admission Council (LSAC) a program on assessment fundamentals for legal educators. “Best Practices in High-Stakes Testing: What Legal Educators Need to Know” was held on February 7–9, 2019, in Albuquerque, New Mexico, and was led by measurement staff from both NCBE and LSAC. (Headline: “Validity! Reliability! Equating! Item-Response Theory!” Again, you get the message. Not flashy, but important nonetheless.)

Finally, I’m delighted to report the late-breaking news that Texas has become the 35th jurisdiction to adopt the UBE. It will start administering the UBE in February 2021. We are so excited to welcome all Texas stakeholders to the UBE.

Sending all of you our best wishes for a happy and healthy 2019.

Until the next issue,

Judy

Judith A. Gundersen

Notes

    1. American Educational Research Association, American Psychological Association, and National Council on Measurement in Education, Standards for Educational and Psychological Testing (American Educational Research Association, 2014).
